Kernel Entropy Component Analysis with Nongreedy L1-Norm Maximization
Authors
Abstract
Related Papers
Robust Principal Component Analysis with Non-Greedy l1-Norm Maximization
Principal Component Analysis (PCA) is one of the most important methods to handle high-dimensional data. However, its high computational complexity makes it hard to apply to large-scale data with high dimensionality, and the L2-norm it uses makes it sensitive to outliers. A recent work proposed principal component analysis based on L1-norm maximization, which is efficient and robust to outliers. In...
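To make the L1-norm objective concrete, below is a minimal NumPy sketch of the classical greedy single-component iteration that maximizes sum_i |w^T x_i| over a unit vector w. This is the baseline that non-greedy methods improve on, not the paper's joint multi-component solver; the function name, tolerances, and toy data are illustrative assumptions.

```python
import numpy as np

def l1_pca_single_component(X, n_iter=100, seed=0):
    """Greedy L1-norm PCA for one component: maximize sum_i |w^T x_i| over unit-norm w.

    X : (d, n) centered data matrix, columns are samples.
    Returns a unit-norm direction w of shape (d,).
    """
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        # Sign of each sample's projection, then re-aggregate the samples.
        signs = np.sign(X.T @ w)
        signs[signs == 0] = 1.0          # avoid zero signs stalling the update
        w_new = X @ signs
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):        # fixed point reached
            break
        w = w_new
    return w

# Toy usage: 5-dimensional data with one gross outlier.
X = np.random.default_rng(1).standard_normal((5, 200))
X[:, 0] += 50.0
X = X - X.mean(axis=1, keepdims=True)    # center the data
w = l1_pca_single_component(X)
print(w, np.abs(X.T @ w).sum())          # direction and attained L1 objective
```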
L1-norm Kernel PCA
We present the first model and algorithm for L1-norm kernel PCA. While L2-norm kernel PCA has been widely studied, there has been no work on L1-norm kernel PCA. For this non-convex and non-smooth problem, we offer geometric understandings through reformulations and present an efficient algorithm where the kernel trick is applicable. To attest the efficiency of the algorithm, we provide a conver...
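For context on where the kernel trick enters, the following sketch implements the widely studied L2-norm kernel PCA that this abstract contrasts itself with (RBF Gram matrix, centering in feature space, eigendecomposition). It is background only, not the paper's L1-norm kernel PCA algorithm, and the kernel choice and parameters are assumptions.

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=1.0):
    """Standard (L2-norm) kernel PCA with an RBF kernel.

    X : (n, d) data matrix, rows are samples.
    Returns the projections of the training points onto the top components.
    """
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))

    # Center the kernel matrix in feature space.
    n = K.shape[0]
    one_n = np.full((n, n), 1.0 / n)
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n

    # Eigendecomposition; eigh returns eigenvalues in ascending order.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    alphas, lambdas = eigvecs[:, idx], np.maximum(eigvals[idx], 1e-12)

    # Project training samples onto the kernel principal components.
    return Kc @ (alphas / np.sqrt(lambdas))

Z = rbf_kernel_pca(np.random.default_rng(0).standard_normal((100, 3)), n_components=2)
print(Z.shape)  # (100, 2)
```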
Optimal sparse L1-norm principal-component analysis
We present an algorithm that computes exactly (optimally) the S-sparse (1 ≤ S < D) maximum-L1-norm-projection principal component of a real-valued data matrix X ∈ R^(D×N) that contains N samples of dimension D. For fixed sample support N, the optimal L1-sparse algorithm has linear complexity in data dimension, O(D). For fixed dimension D (thus, fixed sparsity S), the optimal L1-sparse algorithm has ...
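To make the "exact maximum-L1-norm projection" objective concrete, the naive sketch below computes the dense (non-sparse) L1 principal component by exhausting all 2^N sign patterns, using the identity max_{||w||=1} ||X^T w||_1 = max_{b in {-1,+1}^N} ||Xb||_2 with w* = Xb*/||Xb*||_2. It is exponential in N and only illustrates the objective; the paper's contribution is the optimal S-sparse solution with the complexities stated above.

```python
import numpy as np
from itertools import product

def exact_l1_pc(X):
    """Exact L1-norm principal component of X (D x N, columns are samples):
    maximize ||X^T w||_1 over unit-norm w by exhausting the 2^N sign patterns.
    Exponential in N, so only usable for very small sample supports."""
    D, N = X.shape
    best_val, best_b = -np.inf, None
    for signs in product((-1.0, 1.0), repeat=N):
        b = np.array(signs)
        val = np.linalg.norm(X @ b)          # ||X b||_2 for this sign pattern
        if val > best_val:
            best_val, best_b = val, b
    w = X @ best_b
    return w / np.linalg.norm(w), best_val

X = np.random.default_rng(2).standard_normal((4, 10))   # D=4, N=10 -> 1024 patterns
w, obj = exact_l1_pc(X)
print(obj, np.abs(X.T @ w).sum())   # the two values coincide at the optimum
```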
Principal Component Analysis by Lp-Norm Maximization
This paper proposes several principal component analysis (PCA) methods based on Lp-norm optimization techniques. In doing so, the objective function is defined using the Lp-norm with an arbitrary p value, and the gradient of the objective function is computed on the basis of the fact that the number of training samples is finite. In the first part, an easier problem of extracting only one featu...
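A hedged sketch of the finite-sum gradient idea described here, for the single-component case: projected gradient ascent maximizing sum_i |w^T x_i|^p on the unit sphere. The step size, number of iterations, and the particular p value are illustrative choices, not taken from the paper.

```python
import numpy as np

def lp_pca_single(X, p=1.5, lr=0.01, n_iter=500, seed=0):
    """Projected gradient ascent for one Lp-norm principal component:
    maximize sum_i |w^T x_i|^p subject to ||w|| = 1.

    X : (d, n) centered data, columns are samples."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal(X.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        proj = X.T @ w                                   # (n,) projections w^T x_i
        # Finite-sum gradient: sum_i p * |w^T x_i|^(p-1) * sign(w^T x_i) * x_i
        grad = X @ (p * np.abs(proj) ** (p - 1) * np.sign(proj))
        w = w + lr * grad
        w /= np.linalg.norm(w)                           # project back onto the unit sphere
    return w

X = np.random.default_rng(3).standard_normal((6, 300))
X -= X.mean(axis=1, keepdims=True)
w = lp_pca_single(X, p=1.5)
print(np.sum(np.abs(X.T @ w) ** 1.5))                    # attained Lp objective
```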
Non-Greedy L21-Norm Maximization for Principal Component Analysis
Principal Component Analysis (PCA) is one of the most important unsupervised methods to handle high-dimensional data. However, due to the high computational complexity of its eigendecomposition solution, it is hard to apply PCA to large-scale data with high dimensionality. Meanwhile, the squared L2-norm based objective makes it sensitive to data outliers. In recent research, the L1-norm maximi...
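A minimal sketch, in the spirit of the non-greedy (all-components-at-once) framework, of maximizing the L21-norm objective sum_i ||W^T x_i||_2 over orthonormal W by alternating a surrogate linearization with an orthogonal-Procrustes (SVD) step. The exact update rules and guarantees in the paper may differ; the names and stopping criterion here are illustrative.

```python
import numpy as np

def l21_pca_nongreedy(X, k=2, n_iter=100, seed=0, eps=1e-12):
    """Maximize sum_i ||W^T x_i||_2 subject to W^T W = I (sketch).

    X : (d, n) centered data, columns are samples; k : number of components."""
    d, n = X.shape
    rng = np.random.default_rng(seed)
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))  # orthonormal initialization
    for _ in range(n_iter):
        P = W.T @ X                                   # (k, n) current projections
        norms = np.maximum(np.linalg.norm(P, axis=0), eps)
        A = P / norms                                 # unit direction of each projected sample
        M = X @ A.T                                   # (d, k) surrogate, sum_i x_i a_i^T
        # W maximizing tr(W^T M) over orthonormal W: polar factor of M.
        U, _, Vt = np.linalg.svd(M, full_matrices=False)
        W_new = U @ Vt
        if np.allclose(W_new, W, atol=1e-8):
            break
        W = W_new
    return W

X = np.random.default_rng(4).standard_normal((8, 500))
X -= X.mean(axis=1, keepdims=True)
W = l21_pca_nongreedy(X, k=3)
print(W.shape, np.linalg.norm(W.T @ W - np.eye(3)))   # (8, 3), ~0 (orthonormal)
```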
Journal
Journal title: Computational Intelligence and Neuroscience
Year: 2018
ISSN: 1687-5265,1687-5273
DOI: 10.1155/2018/6791683